Asymptotics for Lasso-Type Estimators

Author

  • Wenjiang Fu
Abstract

We consider the asymptotic behavior of regression estimators that minimize the residual sum of squares plus a penalty proportional to ∑ |β_j|^γ for some γ > 0. These estimators include the Lasso as a special case when γ = 1. Under appropriate conditions, we show that the limiting distributions can have positive probability mass at 0 when the true value of the parameter is 0. We also consider asymptotics for "nearly singular" designs.
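The estimator described in the abstract minimizes RSS(β) + λ ∑ |β_j|^γ, the "bridge" family that reduces to the Lasso at γ = 1. A minimal sketch of such an estimator follows; the function name `fit_bridge` and the parameters `lam` and `gamma` are illustrative choices, not notation from the paper, and a derivative-free optimizer is used only because the penalty is non-smooth at 0.

```python
# Minimal sketch of a Lasso-type (bridge) estimator:
#   minimize ||y - X beta||^2 + lam * sum_j |beta_j|^gamma
# Hypothetical helper for illustration; not the authors' code.
import numpy as np
from scipy.optimize import minimize


def fit_bridge(X, y, lam=1.0, gamma=1.0):
    """Minimize the penalized residual sum of squares."""
    p = X.shape[1]

    def objective(beta):
        resid = y - X @ beta
        # RSS plus the gamma-power penalty on the coefficients
        return resid @ resid + lam * np.sum(np.abs(beta) ** gamma)

    # Nelder-Mead avoids gradients: the penalty has a kink at beta_j = 0,
    # which is exactly what produces point mass at 0 in the limit law.
    res = minimize(objective, np.zeros(p), method="Nelder-Mead",
                   options={"xatol": 1e-9, "fatol": 1e-9, "maxiter": 50000})
    return res.x


rng = np.random.default_rng(0)
X = rng.standard_normal((100, 3))
beta_true = np.array([2.0, 0.0, -1.0])
y = X @ beta_true + 0.1 * rng.standard_normal(100)

beta_hat = fit_bridge(X, y, lam=5.0, gamma=1.0)  # gamma = 1: the Lasso case
```

With γ = 1 the penalty shrinks the coefficient whose true value is 0 toward exactly 0, which is the finite-sample analogue of the positive mass at 0 in the limiting distribution.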


Similar articles

Reweighting the Lasso

This paper investigates how changing the growth rate of the sequence of penalty weights affects the asymptotics of Lasso-type estimators. The cases of non-singular and nearly singular design are considered.


Difference-Based Double Shrinking in Partial Linear Models

The partial linear model is very flexible, since the relation between the covariates and the response may be either parametric or nonparametric. However, estimating the regression coefficients is challenging because the nonparametric component must be estimated simultaneously. As a remedy, the differencing approach, which eliminates the nonparametric component and estimates the regression coefficients, can ...


Parallelism, uniqueness, and large-sample asymptotics for the Dantzig selector.

The Dantzig selector (Candès and Tao, 2007) is a popular ℓ1-regularization method for variable selection and estimation in linear regression. We present a very weak geometric condition on the observed predictors which is related to parallelism and, when satisfied, ensures the uniqueness of Dantzig selector estimators. The condition holds with probability 1, if the predictors are drawn from a co...


High-Dimensional Econometrics and Model Selection

This dissertation consists of three chapters. Chapter 1 proposes a new method to solve the many moment problem: in Generalized Method of Moments (GMM), when the number of moment conditions is comparable to or larger than the sample size, the traditional methods lead to biased estimators. We propose a LASSO based selection procedure in order to choose the informative moments and then, using the ...


Bounds on the prediction error of penalized least squares estimators with convex penalty

This paper considers the penalized least squares estimator with arbitrary convex penalty. When the observation noise is Gaussian, we show that the prediction error is a subgaussian random variable concentrated around its median. We apply this concentration property to derive sharp oracle inequalities for the prediction error of the LASSO, the group LASSO and the SLOPE estimators, both in probab...





Publication date: 2000